    Fast Entropy Estimation for Natural Sequences

    It is well known that accurately estimating the Shannon entropy of symbolic sequences requires a large number of samples. When some structure of the data is known in advance, that knowledge can be exploited to compute the entropy more efficiently, and a number of methods resting on various assumptions have been proposed for small sample sizes. In this paper, we examine this problem and propose a method for estimating the Shannon entropy of a set of ranked symbolic natural events. Using a modified Zipf-Mandelbrot-Li law and a new rank-based coincidence counting method, we derive an efficient algorithm that estimates the entropy with surprising accuracy from only a small number of samples. The algorithm is tested on several natural sequences and shown to yield accurate results with very small amounts of data.
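
    To make the idea concrete, here is a minimal sketch in Python of the general strategy the abstract describes: fit a Zipf-Mandelbrot rank law to the observed rank-frequency counts and read the entropy off the fitted law, rather than plugging badly biased small-sample frequencies into the entropy formula directly. The paper's actual estimator (a modified Zipf-Mandelbrot-Li law combined with rank-based coincidence counting) differs in its details; every function name and parameter value below is an illustrative assumption.

```python
import numpy as np
from collections import Counter
from scipy.optimize import minimize

def zipf_mandelbrot_entropy(samples, vocab_size):
    """Estimate Shannon entropy (bits) of a ranked symbolic sequence.

    Sketch only: fit p(r) ~ (r + q)^(-s) to the observed
    rank-frequency counts by maximum likelihood, then compute the
    entropy of the fitted law over the full vocabulary.
    """
    counts = np.array(sorted(Counter(samples).values(), reverse=True),
                      dtype=float)
    ranks = np.arange(1, len(counts) + 1)
    all_ranks = np.arange(1, vocab_size + 1)

    def neg_log_like(params):
        s, q = params
        # Unnormalized log-probabilities and log-normalizer over the vocab.
        logp = -s * np.log(ranks + q)
        logp -= np.logaddexp.reduce(-s * np.log(all_ranks + q))
        return -np.sum(counts * logp)

    s, q = minimize(neg_log_like, x0=[1.0, 1.0],
                    bounds=[(0.1, 5.0), (0.0, 50.0)]).x

    p = (all_ranks + q) ** (-s)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

# Usage on a short text, treating letters as the ranked symbols:
print(zipf_mandelbrot_entropy(list("the quick brown fox jumps"), vocab_size=26))
```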

    Planning for Excellence: Insights from an International Review of Regulators’ Strategic Plans

    What constitutes regulatory excellence? Answering this question is an indispensable first step for any public regulatory agency that is measuring, striving towards, and, ultimately, achieving excellence. One useful way to answer this question would be to draw on the broader literature on regulatory design, enforcement, and management. But perhaps a more authentic way is to look at how regulators themselves define excellence. However, we actually know remarkably little about how the regulatory officials who are immersed in the task of regulation conceive of their own success. In this Article, we investigate regulators' definitions of regulatory excellence by drawing on a unique source of data that provides an important window on regulators' own aspirations: their strategic plans. Strategic plans have been required or voluntarily undertaken for the past decade or longer by regulators around the globe. In these plans, regulators offer mission statements, strategic goals, and measurable and achievable outcomes, all of which indicate what regulators value and are striving to become. Occasionally, they even state explicitly where they have fallen short of "best-in-class" status and how they intend to improve. To date, a voluminous literature exists examining agency practices in strategic planning, but we are aware of no study that tries to glean from the substance of a sizeable number of plans how regulators themselves construe regulatory excellence. The main task of this Article is to undertake that effort. The Article draws on twenty plans from different regulators in nine countries. We found most generally that excellent regulators describe themselves (though not necessarily using exactly these words) as institutions that are more (1) efficient, (2) educative, (3) multiplicative, (4) proportional, (5) vital, (6) just, and (7) honest. In addition to these seven shared attribute categories, our reading of the plans also revealed five other "unusual" attributes that only one or two agencies mentioned. Beyond merely cataloguing the attributes identified by agencies, this Article also discusses commonalities (and differences) between plan structures, emphases, and framings. We found that the plans differed widely in features such as the specificity of their mission statements, the extent to which they emphasized actions over outcomes (or vice versa), and the extent to which commitments were organized along organizational fiefdoms or cut across bureaucratic lines. We urge future scholarship to explore alternative methods of text mining, and to study strategic plans over time within agencies, in order to track how agencies' notions of regulatory excellence respond to changes in the regulatory context and the larger circumstances within which agencies operate. Looking longitudinally will also shed light on how agencies handle strategic goals that are either met or that prove to be unattainable.

    Fast and scalable Gaussian process modeling with applications to astronomical time series

    The growing field of large-scale time domain astronomy requires methods for probabilistic data analysis that are computationally tractable, even with large datasets. Gaussian Processes are a popular class of models used for this purpose but, since the computational cost in general scales as the cube of the number of data points, their application has been limited to small datasets. In this paper, we present a novel method for Gaussian Process modeling in one dimension where the computational requirements scale linearly with the size of the dataset. We demonstrate the method by applying it to simulated and real astronomical time series datasets. These demonstrations are examples of probabilistic inference of stellar rotation periods, asteroseismic oscillation spectra, and transiting planet parameters. The method exploits structure in the problem when the covariance function is expressed as a mixture of complex exponentials, without requiring evenly spaced observations or uniform noise. This form of covariance arises naturally when the process is a mixture of stochastically driven damped harmonic oscillators, which provides a physical motivation for and interpretation of this choice, but we also demonstrate that it can be a useful effective model in some other cases. We present a mathematical description of the method and compare it to existing scalable Gaussian Process methods. The method is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond. We provide well-tested and documented open-source implementations of this method in C++, Python, and Julia.
    Comment: Updated in response to referee. Submitted to the AAS Journals. Comments (still) welcome. Code available: https://github.com/dfm/celerit
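
    As a usage illustration, the sketch below assumes the celerite Python package from the linked repository and its SHOTerm kernel (the stochastically driven, damped harmonic oscillator term described above); the data and hyperparameter values are invented.

```python
import numpy as np
import celerite
from celerite import terms

# Invented, irregularly sampled time series standing in for a light curve.
np.random.seed(42)
t = np.sort(np.random.uniform(0, 10, 200))
yerr = 0.1 * np.ones_like(t)
y = np.sin(2 * np.pi * t / 2.5) + yerr * np.random.randn(len(t))

# One stochastically driven, damped harmonic oscillator term; celerite
# kernels are sums of such terms, which is what gives the solver its
# linear scaling with the number of data points.
kernel = terms.SHOTerm(log_S0=0.0, log_Q=np.log(10.0),
                       log_omega0=np.log(2 * np.pi / 2.5))

gp = celerite.GP(kernel)
gp.compute(t, yerr)  # factorizes the covariance matrix in O(N)
print("log likelihood:", gp.log_likelihood(y))
```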

    Auto-bandwidth control in dynamically reconfigured hybrid-SDN MPLS networks

    This work is motivated by the steady evolution of bandwidth-demanding technology, which currently, and even more so in future, requires operators to use expensive infrastructure smartly to maximise its value in a very competitive environment. In this thesis, a traffic engineering control loop is proposed that dynamically adjusts the bandwidth and routes of Multi-Protocol Label Switching (MPLS) tunnels in response to changes in traffic demand. Available bandwidth is shifted to where the demand is, and where the demand requirement has dropped, unused allocated bandwidth is returned to the network. An MPLS network enhanced with Software-Defined Networking (SDN) features is implemented. This technology, known as hybrid SDN, combines the programmability of SDN with robust MPLS label switched path features and the traffic engineering enhancements introduced by routing protocols such as Border Gateway Protocol-Traffic Engineering (BGP-TE) and Open Shortest Path First-Traffic Engineering (OSPF-TE). The implemented mixed-integer linear programming formulation, using minimisation of maximum link utilisation and minimum link cost objective functions, combined with the programmability of the hybrid SDN network, accommodates source-to-destination demand fluctuations. A key driver of this research is the programmability of the MPLS network, enhanced by the contributions of SDN controller technology. The centralised view of the network provides the network state information needed to drive the mathematical modelling of the network. The path computation element further enables control of the label switched paths' bandwidths, which are adjusted based on current demand and the optimisation method used. The hose model is used to specify a range of traffic conditions. Its most important benefit is the flexibility allowed in how the traffic matrix can change, provided the aggregate traffic demand does not exceed the hose maximum bandwidth specification. To this end, reserved hose bandwidth can be released to the core network to service demands from other sites.
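
    A minimal sketch of the min-max link utilisation optimisation at the heart of such a control loop is shown below, written in Python with the PuLP modelling library on an invented four-node topology. The thesis' full formulation is a mixed-integer program that additionally captures MPLS path constraints and the hose model's aggregate caps; everything here is a simplified stand-in.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus

# Invented toy topology: directed links with capacities in Gb/s.
nodes = ["A", "B", "C", "D"]
links = {("A", "B"): 10, ("B", "D"): 10, ("A", "C"): 10, ("C", "D"): 10}
demands = {("A", "D"): 12}  # source -> destination demand in Gb/s

prob = LpProblem("min_max_utilisation", LpMinimize)
u = LpVariable("max_utilisation", lowBound=0)  # worst-case link utilisation
f = {(d, l): LpVariable(f"f_{d[0]}{d[1]}_{l[0]}{l[1]}", lowBound=0)
     for d in demands for l in links}  # flow of demand d routed over link l

prob += u  # objective: minimise the maximum link utilisation
for (s, t), vol in demands.items():
    for n in nodes:  # flow conservation at every node
        out_f = lpSum(f[(s, t), l] for l in links if l[0] == n)
        in_f = lpSum(f[(s, t), l] for l in links if l[1] == n)
        prob += out_f - in_f == (vol if n == s else -vol if n == t else 0)
for l, cap in links.items():  # load on each link bounded by u * capacity
    prob += lpSum(f[d, l] for d in demands) <= u * cap

prob.solve()
# Demand of 12 split over two disjoint 10 Gb/s paths -> utilisation 0.6.
print(LpStatus[prob.status], "max utilisation =", u.value())
```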

    Inferring probabilistic stellar rotation periods using Gaussian processes

    Variability in the light curves of spotted, rotating stars is often non-sinusoidal and quasi-periodic: spots move on the stellar surface and have finite lifetimes, causing stellar flux variations to slowly shift in phase. A strictly periodic sinusoid therefore cannot accurately model a rotationally modulated stellar light curve. Physical models of stellar surfaces have many drawbacks preventing effective inference, such as highly degenerate or high-dimensional parameter spaces. In this work, we test an appropriate effective model: a Gaussian Process with a quasi-periodic covariance kernel function. This highly flexible model allows sampling of the posterior probability density function of the period parameter, marginalising over the other kernel hyperparameters using a Markov Chain Monte Carlo approach. To test the effectiveness of this method, we infer rotation periods from 333 simulated stellar light curves, demonstrating that the Gaussian Process method produces periods that are more accurate than both a sine-fitting periodogram and an autocorrelation function method. We also demonstrate that it works well on real data, by inferring rotation periods for 275 Kepler stars with previously measured periods. We provide a table of rotation periods for these 1132 Kepler objects of interest and their posterior probability density function samples. Because this method delivers posterior probability density functions, it will enable hierarchical studies involving stellar rotation, particularly those involving population modelling, such as inferring stellar ages, obliquities in exoplanet systems, or characterising star-planet interactions. The code used to implement this method is available online.
    Comment: Submitted to MNRAS. Replaced 27/06/2017: corrections made to koi_periods.cs
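
    For concreteness, here is a minimal sketch of the kind of model described above: the standard quasi-periodic kernel (a periodic term damped by a squared-exponential envelope, so the phase can drift as spots evolve) and the Gaussian process marginal log likelihood one would hand to an MCMC sampler. The parameterisation is a common textbook form and not necessarily the paper's exact one.

```python
import numpy as np

def quasi_periodic_kernel(t1, t2, amp, length, gamma, period):
    """Quasi-periodic covariance: periodic structure at `period`,
    decorrelating over the envelope timescale `length`."""
    tau = t1[:, None] - t2[None, :]
    return amp * np.exp(-tau**2 / (2 * length**2)
                        - gamma * np.sin(np.pi * tau / period)**2)

def log_likelihood(params, t, y, yerr):
    """GP marginal log likelihood; an MCMC over `params` yields the
    posterior of the period, marginalised over the other
    hyperparameters."""
    amp, length, gamma, period = params
    K = quasi_periodic_kernel(t, t, amp, length, gamma, period)
    K[np.diag_indices_from(K)] += yerr**2  # measurement noise on the diagonal
    cho = np.linalg.cholesky(K)
    alpha = np.linalg.solve(cho.T, np.linalg.solve(cho, y))  # K^{-1} y
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(cho)))        # -0.5 * log|K|
            - 0.5 * len(t) * np.log(2 * np.pi))
```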

    Rapid evolution of metabolic traits explains thermal adaptation in phytoplankton

    Understanding the mechanisms that determine how phytoplankton adapt to warming will substantially improve the realism of models describing the ecological and biogeochemical effects of climate change. Here, we quantify the evolution of elevated thermal tolerance in the phytoplankton Chlorella vulgaris. Initially, population growth was limited at higher temperatures because respiration was more sensitive to temperature than photosynthesis, meaning less carbon was available for growth. Tolerance to high temperature evolved after ≈ 100 generations via greater down-regulation of respiration relative to photosynthesis. By down-regulating respiration, phytoplankton overcame the metabolic constraint imposed by the greater temperature sensitivity of respiration and allocated fixed carbon to growth more efficiently. Rapid evolution of carbon-use efficiency provides a potentially general mechanism for thermal adaptation in phytoplankton and implies that evolutionary responses in phytoplankton will modify biogeochemical cycles, and hence food web structure and function, under warming. Models of climate futures that ignore this capacity for adaptation should therefore be revisited.
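
    As a purely illustrative aid to the mechanism described above, the sketch below applies Boltzmann-Arrhenius temperature scaling with a higher activation energy for respiration than for photosynthesis, so carbon-use efficiency (CUE = 1 - R/P) falls as temperature rises, and a down-regulated respiration rate restores it. All rate constants and activation energies are invented, not the paper's fitted values.

```python
import numpy as np

k_B = 8.617e-5  # Boltzmann constant (eV/K)

def rate(r0, E, T, Tref=293.15):
    """Boltzmann-Arrhenius rate at absolute temperature T (K),
    normalised to r0 at the reference temperature."""
    return r0 * np.exp(-E / k_B * (1.0 / T - 1.0 / Tref))

T = np.array([293.15, 303.15, 313.15])      # 20, 30, 40 degrees C
P = rate(r0=1.0, E=0.75, T=T)               # photosynthesis (less sensitive)
R_ancestral = rate(r0=0.40, E=1.1, T=T)     # respiration, ancestral strain
R_evolved = rate(r0=0.25, E=1.1, T=T)       # respiration down-regulated

for label, R in [("ancestral", R_ancestral), ("evolved", R_evolved)]:
    print(label, "CUE:", np.round(1 - R / P, 3))
```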

    Plasticity in transmission strategies of the malaria parasite, Plasmodium chabaudi: environmental and genetic effects

    Parasites may alter their behaviour to cope with changes in the within-host environment. In particular, investment in transmission may alter in response to the availability of parasite resources or to host immune responses. However, experimental and theoretical studies have drawn conflicting conclusions regarding parasites' optimal (adaptive) responses to deterioration in habitat quality. We analyse data from acute infections with six genotypes of the rodent malaria parasite Plasmodium chabaudi to quantify how investment in transmission (gametocytes) is influenced by the within-host environment. Using a minimum of modelling assumptions, we find that proportional investment in gametocytogenesis increases sharply with host anaemia and also increases at low parasite densities. Further, stronger dependence of investment on parasite density is associated with greater virulence of the parasite genotype. Our study provides a robust quantitative framework for studying parasites' responses to the host environment and whether these responses are adaptive, which is crucial for predicting the short-term and evolutionary impact of transmission-blocking treatments for parasitic diseases.
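
    To illustrate the quantity being analysed, the sketch below computes proportional investment (gametocytes over gametocytes plus asexuals) on simulated counts and fits a binomial GLM against a host anaemia proxy and parasite density. The data, column names, and model form are hypothetical stand-ins, not the paper's analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "rbc": rng.uniform(2, 10, n),            # red blood cell density (anaemia proxy)
    "log_parasites": rng.uniform(6, 12, n),  # log asexual parasite density
})

# Simulate investment rising with anaemia (low rbc) and low density.
p = 1 / (1 + np.exp(-(1.0 - 0.4 * df["rbc"] - 0.2 * df["log_parasites"])))
total = rng.integers(50, 500, n)             # parasites counted per host
df["gametocytes"] = rng.binomial(total, p.to_numpy())
df["asexuals"] = total - df["gametocytes"]

# Binomial GLM on the gametocyte fraction (successes, failures).
X = sm.add_constant(df[["rbc", "log_parasites"]])
model = sm.GLM(df[["gametocytes", "asexuals"]], X,
               family=sm.families.Binomial()).fit()
print(model.summary())
```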

    Entry, Descent and Landing Using Ballutes

    The In-Space Propulsion Program is funding a team led by Kevin Miller at Ball Aerospace. This team of industry, NASA, and academic researchers is actively pursuing ballute technology development, with very promising results. The focus of that study has been to maximize the payload that is put into orbit (around Titan, Neptune, and Mars). So far, the mass associated with the ballute has been minimized because the ballute is discarded after use. If an instrument package is attached to the ballute, however, it will eventually land on the surface. The ballute can thus do double duty: aerocapture the orbiter and soft-land a set of instruments on the surface.